SauerkrautLM Mixtral 8x7B GGUF
Apache-2.0
SauerkrautLM Mixtral 8x7B is a multilingual text generation model based on the Mixtral mixture-of-experts architecture. It has been fine-tuned and aligned using supervised fine-tuning (SFT) and direct preference optimization (DPO), and supports English, German, French, Italian, and Spanish.
Large Language Model · Transformers · Supports Multiple Languages
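A minimal usage sketch, assuming the GGUF weights are run with llama-cpp-python (one common way to load GGUF files; not necessarily the authors' recommended stack). The file name `sauerkrautlm-mixtral-8x7b.Q4_K_M.gguf` is a hypothetical placeholder for whichever quantization you download.

```python
from llama_cpp import Llama

# Hypothetical local path to a downloaded GGUF quantization of this model.
llm = Llama(
    model_path="./sauerkrautlm-mixtral-8x7b.Q4_K_M.gguf",
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if available
)

# German prompt to illustrate the model's multilingual support.
output = llm.create_chat_completion(
    messages=[
        {"role": "system", "content": "Du bist ein hilfreicher Assistent."},
        {"role": "user", "content": "Erkläre kurz, was ein Mixture-of-Experts-Modell ist."},
    ],
    max_tokens=256,
    temperature=0.7,
)

print(output["choices"][0]["message"]["content"])
```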